Penalized AdaBoost: Improving the Generalization Error of Gentle AdaBoost through a Margin Distribution

Authors
Abstract


Related articles

A Real Generalization of Discrete AdaBoost

Scaling discrete AdaBoost to handle real-valued weak hypotheses has often been done under the auspices of convex optimization, but little is generally known from the original boosting model standpoint. We introduce a novel generalization of discrete AdaBoost which departs from this mainstream of algorithms. From the theoretical standpoint, it formally displays the original boosting property, as...
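
For orientation, here is a minimal sketch of the classic discrete AdaBoost loop that real-valued generalizations start from; it is not the paper's algorithm, and the weak_learner callable and the 0.5 stopping threshold are illustrative choices of this summary rather than details from the article.

import numpy as np

def discrete_adaboost(X, y, weak_learner, T):
    """y in {-1, +1}; weak_learner(X, y, w) must return a -1/+1 predictor."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform example weights
    hypotheses, alphas = [], []
    for _ in range(T):
        h = weak_learner(X, y, w)        # fit on the weighted sample
        pred = h(X)                      # discrete outputs in {-1, +1}
        err = np.sum(w[pred != y])       # weighted training error
        if err >= 0.5:                   # no edge over random guessing
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)   # up-weight mistakes, down-weight hits
        w /= w.sum()                     # renormalize to a distribution
        hypotheses.append(h)
        alphas.append(alpha)
    return lambda X_new: np.sign(sum(a * h(X_new) for a, h in zip(alphas, hypotheses)))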


Face Detection Using Look-Up Table Based Gentle AdaBoost

In this work, we propose a face detection method based on the Gentle AdaBoost algorithm, which is used to construct binary-tree-structured strong classifiers. The Gentle AdaBoost update values are constructed from the difference of the conditional class probabilities for a given value of the Haar features proposed by [1]. Using this approach, a classifier which can model image ...
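
As a rough illustration of the look-up-table weak learner described above, the sketch below bins a single Haar-feature response and stores, for each bin, the weighted difference of the conditional class probabilities, which is the value a Gentle AdaBoost round would add for examples falling in that bin. The equal-width binning and the bin count of 8 are assumptions of this sketch, not details taken from the paper.

import numpy as np

def fit_lut_weak_learner(feature_values, y, w, n_bins=8):
    """feature_values: Haar responses; y in {-1, +1}; w: current boosting weights."""
    edges = np.linspace(feature_values.min(), feature_values.max(), n_bins + 1)
    bins = np.digitize(feature_values, edges[1:-1])       # bin index in 0..n_bins-1
    lut = np.zeros(n_bins)
    for b in range(n_bins):
        in_bin = bins == b
        w_pos = w[in_bin & (y == 1)].sum()                # weighted mass of positives
        w_neg = w[in_bin & (y == -1)].sum()               # weighted mass of negatives
        if w_pos + w_neg > 0:
            lut[b] = (w_pos - w_neg) / (w_pos + w_neg)    # P_w(+1|bin) - P_w(-1|bin)

    def predict(values):
        return lut[np.digitize(values, edges[1:-1])]      # confidence-rated output in [-1, 1]

    return predict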


Obscenity Detection Using Haar-Like Features and Gentle Adaboost Classifier

A large exposed skin area in an image is often taken as a sign of obscenity. This criterion alone may produce many false positives on images containing skin-like objects, and may miss images with only partially exposed skin that nevertheless show erotogenic human body parts. This paper presents a novel method for detecting nipples in pornographic image content. The nipple is considered an erotogenic organ to identify ...


Error Bounds for Aggressive and Conservative AdaBoost

Three AdaBoost variants are distinguished based on the strategies applied to update the weights for each new ensemble member. The classic AdaBoost due to Freund and Schapire only decreases the weights of the correctly classified objects and is conservative in this sense. All the weights are then updated through a normalization step. Other AdaBoost variants in the literature update all the weigh...
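
To make the distinction concrete, here is a small sketch of the two update styles for labels in {-1, +1}, given the weighted error eps of the new ensemble member on the current weights w. The boolean mask correct and the exp(-alpha)/exp(+alpha) factors in the second function are illustrative choices here, not the paper's exact formulation.

import numpy as np

def conservative_update(w, correct, eps):
    """Freund-Schapire style: only correctly classified examples are down-weighted
    (by beta = eps / (1 - eps)); normalization then raises the relative weight
    of the misclassified ones."""
    beta = eps / (1.0 - eps)
    w = w.copy()
    w[correct] *= beta
    return w / w.sum()

def aggressive_update(w, correct, eps):
    """Variant that changes every weight before normalizing: correct examples are
    multiplied by exp(-alpha), misclassified ones by exp(+alpha)."""
    alpha = 0.5 * np.log((1.0 - eps) / eps)
    w = w.copy()
    w[correct] *= np.exp(-alpha)
    w[~correct] *= np.exp(alpha)
    return w / w.sum()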


‘Modest AdaBoost’ – Teaching AdaBoost to Generalize Better

Boosting is a technique for combining a set of weak classifiers into one high-performance prediction rule. Boosting has been successfully applied to problems such as object detection, text analysis, and data mining. The most widely used boosting algorithm is AdaBoost, together with its later, more effective variants Gentle AdaBoost and Real AdaBoost. In this article we propose a new boosting algorithm, whi...



Journal

Journal title: IEICE Transactions on Information and Systems

Year: 2015

ISSN: 0916-8532, 1745-1361

DOI: 10.1587/transinf.2015edp7069